
Conversation


@prateek prateek commented Apr 11, 2025

This PR adds support for configuring SSL certificate handling when using LLM behind corporate proxies like Zscaler.

Problem

As described in issue #772, users behind corporate proxies or firewalls that perform SSL inspection (like Zscaler) encounter connection errors because the HTTPS certificate validation fails.

Unlike uv (which has a --native-tls option), LLM didn't have a way to configure certificate handling to work in these environments.

Solution

This PR adds environment variables to configure SSL certificate handling:

# Use the system's native certificate store (similar to uv's --native-tls)
export LLM_SSL_CONFIG=native_tls
# Or specify a custom CA bundle
export LLM_CA_BUNDLE=/path/to/cert.pem

The configuration options include:

  • LLM_SSL_CONFIG=native_tls: Use the system's native certificate store
  • LLM_SSL_CONFIG=no_verify: Disable certificate verification (not recommended for production)
  • LLM_CA_BUNDLE=/path/to/cert.pem: Use a custom CA bundle file

Implementation Details

  • Added a helper function _configure_ssl_client that reads environment variables for SSL configuration
  • Added validation for SSL configuration values with helpful warning messages
  • Added certificate file existence checking to prevent silent failures
  • Integrated the helper function with the get_client method
  • Added comprehensive tests for all scenarios and configurations
  • Added detailed documentation in the docs

When using LLM behind corporate proxies or firewalls that perform SSL inspection
(like Zscaler), HTTPS certificate validation can fail with connection errors.

This adds support for two environment variables:
- LLM_SSL_CONFIG: Configure SSL verification behavior ('native_tls' or 'no_verify')
- LLM_CA_BUNDLE: Path to a custom CA certificate bundle file

The implementation adds a helper function that configures the OpenAI client's
HTTP transport based on these settings.

Fixes simonw#772
return input_dict


def _configure_ssl_client(model_id):

This function doesn't seem to be using the model_id argument for anything...

Comment on lines +547 to +548
if "http_client" not in kwargs:
kwargs["http_client"] = logging_client()

Doesn't this mean that LLM_OPENAI_SHOW_RESPONSES won't work if a custom SSL configuration is in use?
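
One way to make the two features coexist would be explicit precedence logic. The following is a hypothetical sketch for discussion (all names here are placeholders, not the project's actual identifiers):

```python
def choose_http_client(kwargs, ssl_client=None, make_logging_client=None):
    """Hypothetical precedence: an http_client explicitly passed by the
    caller wins; otherwise a custom SSL-configured client; otherwise the
    LLM_OPENAI_SHOW_RESPONSES logging client. Illustrative only."""
    if "http_client" not in kwargs:
        if ssl_client is not None:
            kwargs["http_client"] = ssl_client
        elif make_logging_client is not None:
            kwargs["http_client"] = make_logging_client()
    return kwargs
```

This still means response logging is skipped when a custom SSL client is in use, which is exactly the conflict being raised here; resolving it fully would require one client configured for both.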

try:
if ssl_config == "native_tls":
# Use the system's native certificate store
return DefaultHttpxClient(transport=httpx.HTTPTransport(verify=True))
@bissonex bissonex Apr 15, 2025


I don't think this is enough to use the system's native certificate store. Have a look at encode/httpx#2490. The problem with the proposed solution, using truststore, is that it requires Python 3.10 or later.

@simonfisher-ss

Is there an update with this? It doesn't seem clear if the original PR works to enable native TLS...


bissonex commented May 7, 2025

> Is there an update with this? It doesn't seem clear if the original PR works to enable native TLS...

I made a draft PR that I am using daily at work with the only caveat that it needs Python 3.10+.

prateek (Author) commented May 17, 2025

Sorry, I got caught up in something and didn't circle back. My day job finally got back to me about Zscaler: they're updating the root CA in the next ~2 weeks, so I won't need this in a few days.

Going to close this; let's use @bissonex's version if it's still relevant.

@prateek prateek closed this May 17, 2025

4 participants